Neural Dynamics of Learning and Performance of Fixed Sequences: Latency Pattern Reorganizations and the N-STREAMS Model

Authors

  • Bradley Rhodes
  • Daniel Bullock
Abstract

Fixed sequences performed from memory play a key role in human cultural behavior, especially in music and in rapid communication through speaking, handwriting, and typing. Upon first performance, fixed sequences are often produced slowly, but extensive practice leads to performance that is both fluid and as rapid as allowed by constraints inherent in the task or the performer. The experimental study of fixed sequence learning and production has generated a large database with some challenging findings, including practice-related reorganizations of temporal properties of performance. In this paper, we analyze this literature and identify a coherent set of robust experimental effects. Among these are both the sequence length effect on latency, a dependence of reaction time on sequence length, and practice-dependent loss of the length effect on latency. We then introduce a neural network architecture capable of explaining these effects. Called the N-STREAMS model, this multi-module architecture embodies the hypothesis that the brain uses several substrates for serial order representation and learning. The theory describes three such substrates and how learning autonomously modifies their interaction over the course of practice. A key feature of the architecture is the co-operation of a 'competitive queuing' performance mechanism with both fundamentally parallel ('priority-tagged') and fundamentally sequential ('chain-like') representations of serial order. A neurobiological interpretation of the architecture suggests how different parts of the brain divide the labor for serial learning and performance. Rhodes (1999) presents a complete mathematical model as implementation of the architecture, and reports successful simulations of the major experimental effects.
It also highlights how the network mechanisms incorporated in the architecture compare and contrast with earlier substrates proposed for competitive queuing, priority tagging, and response chaining.

A critical component of human culture is an open set of adaptable procedures that are continually being invented, remembered, generalized, and replicated by individual and social learning mechanisms. Generalized procedures often have branch points at which intelligent choice among alternative courses of action requires vigilant deliberation within the evolving internal and external environment. When selected courses of action have no further branch points, i.e., when selected actions are fixed movement sequences, then extensive practice leads to performance that is both fluid and as rapid as allowed by constraints inherent in the task or the performer. Such fixed sequences, which are ultimately performed from memory with reduced or no external cueing, play a fundamental role in human cultural behavior, most obviously in music and in rapid communication through speaking, handwriting, and typing, but also in sports and manual skills. Learned sequences performed rapidly from memory are also exemplified in the skilled activity of diverse avian and mammalian species, notably the composition of action under mnemonic and contextual control.

Classifying Theories of Sequence Learning and Production

Before reviewing the formal chronometric studies of interest, it is useful to construct a classification scheme for theories of sequence learning and production. Such a scheme should encompass the full range of viable alternative model types.
Among the most important discussions of basic alternatives is that of Lashley (1951/1960), because he used chronometric data (as well as error data) as part of an argument about mechanisms for sequence production. In particular, Lashley used data on the rate of signal conduction in nerve fibers to argue against stimulus-response (S-R) chaining as the basis for rapid performances such as expert typing. In brief, Lashley argued that felt contact with the nth key could not be the functional stimulus for the next response, striking key n+1, because expert typists have already launched stroke n+1 before the brain can have generated a movement triggered by sensory feedback from stroke n. (For caveats, see Abbs, Gracco, & Cole, 1984; Jacobs & Bullock, 1998). In place of stimulus-response chaining, Lashley argued for centrally organized sequences. To allow sufficient flexibility for context-dependent reordering, he also argued for a complex mechanism consisting of two separate processes, one for representing the items constituting a sequence, and the other for representing the order that should obtain among those items (a separation incorporated in formal grammatical theories, cf. Borsley, 1991). Since Lashley's S-R critique and counterproposal, there have been many subsequent proposals, some with much greater mechanistic detail. In one subset of these, the mechanism of response chaining has been revived with various augmentations (Arbib & Dominey, 1995; Cleeremans & Jiménez, 1998;
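The 'competitive queuing' idea mentioned in the abstract can be illustrated with a toy sketch: every item of a planned sequence is activated in parallel with a graded priority, the strongest item wins the competition and is performed, and the winner is then self-inhibited so the runner-up wins the next cycle. This is only an illustrative sketch of the general mechanism, not the paper's mathematical model; the function name and values are hypothetical.

```python
def competitive_queue(priorities):
    """Perform a sequence from a parallel 'priority-tagged' plan.

    priorities: dict mapping item -> activation strength. Each cycle
    the most active item wins the competition, is emitted, and is
    then suppressed (deleted), so the next-strongest wins next.
    """
    plan = dict(priorities)
    output = []
    while plan:
        winner = max(plan, key=plan.get)   # competition: strongest item wins
        output.append(winner)              # perform the winning item
        del plan[winner]                   # self-inhibition after performance
    return output

# A primacy gradient over the items reproduces their serial order:
print(competitive_queue({"C": 0.9, "A": 0.6, "T": 0.3}))  # ['C', 'A', 'T']
```

Note that serial order is carried entirely by the parallel activation gradient, which is what distinguishes this scheme from a response chain in which each item cues the next.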


Similar resources

Two Novel Learning Algorithms for CMAC Neural Network Based on Changeable Learning Rate

The Cerebellar Model Articulation Controller (CMAC) neural network is a computational model of the cerebellum which acts as a lookup table. The advantages of CMAC are fast learning convergence and the capability of mapping nonlinear functions, due to its local generalization of weight updating, single structure, and easy processing. In the training phase, the disadvantage of some CMAC models is an unstable phenomenon...
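The lookup-table behavior described above can be sketched minimally: several offset quantizations ("tilings") of the input each activate one cell, the output sums the active cells' weights, and training touches only those few weights, which is the local generalization that makes convergence fast. This is a hypothetical one-dimensional toy, not any of the cited CMAC variants; all names and parameters are illustrative.

```python
class TinyCMAC:
    """Minimal 1-D CMAC sketch: overlapping tilings as a lookup table."""

    def __init__(self, n_tilings=8, tile_width=0.1, lr=0.2):
        self.n_tilings = n_tilings
        self.width = tile_width
        self.lr = lr
        self.w = {}                        # sparse weight table

    def _cells(self, x):
        # each tiling is shifted by a fraction of the tile width
        return [(t, int((x + t * self.width / self.n_tilings) // self.width))
                for t in range(self.n_tilings)]

    def predict(self, x):
        return sum(self.w.get(c, 0.0) for c in self._cells(x))

    def train(self, x, target):
        err = target - self.predict(x)
        for c in self._cells(x):           # update only the active cells
            self.w[c] = self.w.get(c, 0.0) + self.lr * err / self.n_tilings

cmac = TinyCMAC()
for _ in range(50):
    cmac.train(0.5, 1.0)                   # learn f(0.5) = 1.0 locally
```

After training, inputs near 0.5 share active cells and inherit the learned value, while distant inputs are unaffected, illustrating local generalization.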


Intelligent identification of vehicle’s dynamics based on local model network

This paper proposes an intelligent approach for dynamic identification of vehicles. The proposed approach is based on data-driven identification and uses a high-performance local model network (LMN) for estimation of the vehicle's longitudinal velocity, lateral acceleration, and yaw rate. The proposed LMN requires no pre-defined standard vehicle model and uses measurement data to identif...


Mining Frequent Patterns in Uncertain and Relational Data Streams using the Landmark Windows

Today, in many modern applications, we search for frequent and repeating patterns in the analyzed data sets. In this search, we look for patterns that frequently appear in the data set and mark them as frequent patterns, to enable users to make decisions based on these discoveries. Most algorithms presented in the context of data stream mining and frequent pattern detection work either on uncertai...
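The landmark-window idea named in the title can be sketched simply: counting starts at a fixed landmark (here, the stream's start), every arriving item is counted from that point onward, and items whose support ratio over the whole window meets a threshold are reported as frequent. A minimal single-item sketch, assuming an exact-counting setting; the function name and threshold are illustrative, not from the cited algorithm.

```python
from collections import Counter

def frequent_items(stream, min_support):
    """Report items whose support from the landmark (stream start)
    meets min_support, as a fraction of all items seen so far."""
    counts = Counter(stream)               # exact counts since the landmark
    n = sum(counts.values())
    return {item for item, c in counts.items() if c / n >= min_support}

print(frequent_items(["a", "b", "a", "c", "a", "b"], 0.3))  # {'a', 'b'}
```

Real stream miners replace the exact Counter with bounded-memory approximate counts, since a landmark window grows without limit.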


Five-Zone Simulating Moving Bed for Ternary Separation

The five-zone Simulating Moving Bed (SMB) system, designed for ternary separation, is a modified form of the standard four-zone SMB, which is only effective in binary separation. It was reported in the literature that the five-zone SMB separates the extract-II stream with a lower purity value than that of the raffinate and extract-I streams. To address this issue, a five-zone SMB was designed, using safety ma...


Wavelet Neural Network with Random Wavelet Function Parameters

The training algorithm of Wavelet Neural Networks (WNN) is a bottleneck which impacts the accuracy of the final WNN model. Several methods have been proposed for training WNNs. From the perspective of our research, most of these algorithms are iterative and need to adjust all the parameters of the WNN. This paper proposes a one-step learning method which changes the weights between hidden la...




Journal title:

Volume   Issue 

Pages  -

Publication date 2001